Lasso inference for high-dimensional time series

Authors

Abstract

In this paper we develop valid inference for high-dimensional time series. We extend the desparsified lasso to a time series setting under Near-Epoch Dependence (NED) assumptions, allowing for non-Gaussian, serially correlated and heteroskedastic processes, where the number of regressors can possibly grow faster than the time dimension. We first derive an error bound under weak sparsity which, coupled with the NED assumption, means this inequality can also be applied to the (inherently misspecified) nodewise regressions performed in the desparsified lasso. This allows us to establish uniform asymptotic normality under general conditions, including for inference on parameters of increasing dimensions. Additionally, we show consistency of a long-run variance estimator, thus providing a complete set of tools for performing inference in high-dimensional linear models. Finally, we perform a simulation exercise to demonstrate the small sample properties of the method in common settings.
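The abstract refers to the desparsified (de-biased) lasso: an initial lasso fit is corrected using nodewise lasso regressions that build an approximate inverse of the Gram matrix. Below is a minimal sketch of that construction in Python with numpy and scikit-learn. The penalty choices via cross-validation are an illustrative assumption, not the authors' implementation, and the paper's long-run variance estimator (needed for confidence intervals under serial correlation) is omitted.

import numpy as np
from sklearn.linear_model import LassoCV

def desparsified_lasso(X, y):
    """Sketch of the de-biased lasso estimate for y = X beta + u."""
    n, p = X.shape
    # Step 1: initial lasso fit of y on X
    # (plain cross-validation is assumed here for simplicity; with time
    # series data the tuning would need to respect serial dependence)
    init = LassoCV(cv=5).fit(X, y)
    beta_init = init.coef_
    resid = y - X @ beta_init
    # Step 2: nodewise lasso regressions of each column on the others,
    # building a relaxed inverse Theta of the Gram matrix X'X / n
    Theta = np.zeros((p, p))
    for j in range(p):
        idx = np.arange(p) != j
        node = LassoCV(cv=5).fit(X[:, idx], X[:, j])
        gamma_j = node.coef_
        nodewise_resid = X[:, j] - X[:, idx] @ gamma_j
        tau2_j = nodewise_resid @ X[:, j] / n  # tau_j^2 = x_j' e_j / n
        row = np.zeros(p)
        row[j] = 1.0
        row[idx] = -gamma_j
        Theta[j] = row / tau2_j
    # Step 3: one-step de-biasing correction of the initial lasso estimate
    beta_debiased = beta_init + Theta @ X.T @ resid / n
    return beta_debiased

Turning the de-biased estimate into tests or intervals additionally requires a variance estimator that accounts for serial correlation and heteroskedasticity, which is the role of the long-run variance estimator discussed in the abstract.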

Similar Articles

Modeling High Dimensional Time Series

This paper investigates the effectiveness of the recently proposed Gaussian Process Dynamical Model (GPDM) on high-dimensional chaotic time series. The GPDM takes a Bayesian approach to modeling high-dimensional time series data, using the Gaussian Process Latent Variable Model (GPLVM) for nonlinear dimensionality reduction combined with a nonlinear dynamical model in latent space. The GPDM is ...

Factor Modeling for High-Dimensional Time Series: Inference for the Number of Factors

This paper deals with the factor modeling for high-dimensional time series based on a dimension-reduction viewpoint. Under stationary settings, the inference is simple in the sense that both the number of factors and the factor loadings are estimated in terms of an eigenanalysis for a non-negative definite matrix, and is therefore applicable when the dimension of time series is in the order of ...
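As a rough illustration of the eigenanalysis-based inference for the number of factors sketched in this abstract, the snippet below implements a commonly used ratio-of-eigenvalues criterion applied to a non-negative definite matrix built from lagged autocovariances. The lag cutoff k0 and the ratio rule are illustrative assumptions rather than the paper's exact procedure.

import numpy as np

def estimate_num_factors(Y, k0=2):
    """Y: (T, p) array holding a p-dimensional time series over T periods."""
    T, p = Y.shape
    Yc = Y - Y.mean(axis=0)
    # Non-negative definite matrix M = sum_k Sigma_y(k) Sigma_y(k)'
    M = np.zeros((p, p))
    for k in range(1, k0 + 1):
        Sigma_k = Yc[k:].T @ Yc[:-k] / (T - k)  # lag-k autocovariance
        M += Sigma_k @ Sigma_k.T
    eigvals = np.sort(np.linalg.eigvalsh(M))[::-1]  # descending eigenvalues
    # Ratio criterion: the estimated number of factors minimizes
    # lambda_{i+1} / lambda_i over the leading eigenvalues
    ratios = eigvals[1:p // 2 + 1] / eigvals[:p // 2]
    return int(np.argmin(ratios)) + 1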

Thresholded Lasso for High Dimensional Variable Selection

Given n noisy samples with p dimensions, where n ≪ p, we show that the multi-step thresholding procedure based on the Lasso – we call it the Thresholded Lasso – can accurately estimate a sparse vector β ∈ R^p in a linear model Y = Xβ + ε, where X_{n×p} is a design matrix normalized to have column ℓ2-norm √n, and ε ∼ N(0, σ²I_n). We show that under the restricted eigenvalue (RE) condition (Bickel-Ritov-T...
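A minimal sketch of a multi-step thresholding procedure of the kind described above: an initial lasso fit, hard-thresholding of the estimated coefficients, and an OLS refit on the selected support. The specific threshold rule and penalty level are illustrative assumptions, not the paper's tuning.

import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

def thresholded_lasso(X, y, lam, threshold):
    n, p = X.shape
    # Step 1: initial lasso estimate
    beta_lasso = Lasso(alpha=lam).fit(X, y).coef_
    # Step 2: hard-threshold small coefficients to select the support
    support = np.flatnonzero(np.abs(beta_lasso) > threshold)
    beta = np.zeros(p)
    if support.size > 0:
        # Step 3: OLS refit restricted to the selected variables
        ols = LinearRegression().fit(X[:, support], y)
        beta[support] = ols.coef_
    return beta, support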

Localized Lasso for High-Dimensional Regression

We introduce the localized Lasso, which is suited for learning models that are both interpretable and have high predictive power in problems with high dimensionality d and small sample size n. More specifically, we consider a function defined by local sparse models, one at each data point. We introduce sample-wise network regularization to borrow strength across the models, and sample-wise ex...

Variational Inference for On-line Anomaly Detection in High-Dimensional Time Series

Approximate variational inference has been shown to be a powerful tool for modeling unknown complex probability distributions. Recent advances in the field allow us to learn probabilistic models of sequences that actively exploit spatial and temporal structure. We apply a Stochastic Recurrent Network (STORN) to learn robot time series data. Our evaluation demonstrates that we can robustly detect ano...

Journal

Journal title: Journal of Econometrics

Year: 2023

ISSN: 0304-4076, 1872-6895

DOI: https://doi.org/10.1016/j.jeconom.2022.08.008